High-dimensional subset recovery in noise: Sparsified measurements without loss of statistical efficiency
Authors
Abstract
We consider the problem of estimating the support of a vector β ∈ R^p based on observations contaminated by noise. A significant body of work has studied the behavior of l1-relaxations when applied to measurement matrices drawn from standard dense ensembles (e.g., Gaussian, Bernoulli). In this paper, we analyze sparsified measurement ensembles and consider the tradeoff between measurement sparsity, as measured by the fraction γ of non-zero entries, and statistical efficiency, as measured by the minimal number of observations n required for exact support recovery with probability converging to one. Our main result proves that it is possible to let γ → 0 at some rate, yielding measurement matrices with a vanishing fraction of non-zeros per row while retaining the same statistical efficiency as dense ensembles. A variety of simulation results confirm the sharpness of our theoretical predictions.
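As an illustration of the setup described above, the following is a minimal sketch (not the paper's exact experiment) of support recovery with a γ-sparsified measurement matrix: each entry of the design is non-zero with probability γ, and the support is estimated by an l1-relaxation (the Lasso). The dimensions p, k, n, the fraction γ, the noise level, and the regularization weight are illustrative assumptions.

```python
# Minimal sketch: gamma-sparsified Gaussian design + Lasso support recovery.
# All parameter values below are illustrative, not taken from the paper.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
p, k, n, gamma, sigma = 512, 8, 200, 0.05, 0.25

# k-sparse signal with entries bounded away from zero
beta = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
beta[support] = rng.choice([-1.0, 1.0], size=k)

# Sparsified ensemble: each entry is non-zero with probability gamma,
# Gaussian when present, scaled so each entry has unit variance.
mask = rng.random((n, p)) < gamma
X = mask * rng.normal(scale=1.0 / np.sqrt(gamma), size=(n, p))

y = X @ beta + sigma * rng.normal(size=n)

# l1-relaxation (Lasso); the support estimate is the set of non-zero coefficients.
lasso = Lasso(alpha=0.1, fit_intercept=False, max_iter=10000).fit(X, y)
support_hat = np.flatnonzero(np.abs(lasso.coef_) > 1e-3)

print("true support:     ", np.sort(support))
print("estimated support:", support_hat)
```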
Similar resources
A Sharp Sufficient Condition for Sparsity Pattern Recovery
The sufficient number of noisy linear measurements for exact and approximate sparsity pattern/support set recovery in the high-dimensional setting is derived. Although this problem has been addressed in the recent literature, there are still considerable gaps between those results and the exact limits of perfect support set recovery. To reduce this gap, in this paper, the sufficient con...
High-dimensional Variable Selection with Sparse Random Projections: Measurement Sparsity and Statistical Efficiency
We consider the problem of high-dimensional variable selection: given n noisy observations of a k-sparse vector β∗ ∈ R^p, estimate the subset of non-zero entries of β∗. A significant body of work has studied the behavior of l1-relaxations when applied to random measurement matrices that are dense (e.g., Gaussian, Bernoulli). In this paper, we analyze sparsified measurement ensembles, and consider th...
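To illustrate the sparsity/efficiency tradeoff discussed in this abstract, here is a hedged Monte Carlo sketch (not the authors' simulation) comparing the empirical probability of exact support recovery under a dense Gaussian ensemble and a γ-sparsified ensemble as the number of observations n grows. The helper run_trial and all parameter values are hypothetical choices for illustration.

```python
# Monte Carlo sketch: empirical probability of exact support recovery,
# dense versus gamma-sparsified design, as n grows. Illustrative parameters only.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
p, k, sigma = 256, 5, 0.25

def run_trial(n, gamma):
    """One trial: draw a (possibly sparsified) design, run the Lasso,
    and report whether the estimated support matches the true one."""
    beta = np.zeros(p)
    support = rng.choice(p, size=k, replace=False)
    beta[support] = rng.choice([-1.0, 1.0], size=k)
    if gamma < 1.0:
        mask = rng.random((n, p)) < gamma
        X = mask * rng.normal(scale=1.0 / np.sqrt(gamma), size=(n, p))
    else:
        X = rng.normal(size=(n, p))
    y = X @ beta + sigma * rng.normal(size=n)
    coef = Lasso(alpha=0.1, fit_intercept=False, max_iter=10000).fit(X, y).coef_
    return set(np.flatnonzero(np.abs(coef) > 1e-3)) == set(support)

for n in (50, 100, 200, 400):
    dense = np.mean([run_trial(n, 1.0) for _ in range(20)])
    sparse = np.mean([run_trial(n, 0.05) for _ in range(20)])
    print(f"n={n:4d}  P(success) dense={dense:.2f}  sparsified={sparse:.2f}")
```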
Fast global convergence of gradient methods for high-dimensional statistical recovery
Many statistical M-estimators are based on convex optimization problems formed by the combination of a data-dependent loss function with a norm-based regularizer. We analyze the convergence rates of projected gradient and composite gradient methods for solving such problems, working within a high-dimensional framework that allows the data dimension d to grow with (and possibly exceed) the samp...
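A minimal sketch of the kind of composite (proximal) gradient update analyzed in this line of work, applied here to a Lasso-type M-estimator; the step size, regularization weight, and problem sizes are illustrative assumptions rather than values from the paper.

```python
# Composite (proximal) gradient sketch for a Lasso-type M-estimator:
# smooth squared-error loss plus an l1 regularizer, updated via soft-thresholding.
import numpy as np

rng = np.random.default_rng(2)
n, d, lam = 200, 500, 0.05
X = rng.normal(size=(n, d)) / np.sqrt(n)
beta_true = np.zeros(d)
beta_true[:10] = 1.0
y = X @ beta_true + 0.1 * rng.normal(size=n)

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

step = 1.0 / np.linalg.norm(X, 2) ** 2       # 1 / Lipschitz constant of the smooth part
beta = np.zeros(d)
for _ in range(1000):
    grad = X.T @ (X @ beta - y)              # gradient of the squared-error loss
    beta = soft_threshold(beta - step * grad, step * lam)

obj = 0.5 * np.sum((y - X @ beta) ** 2) + lam * np.sum(np.abs(beta))
print("objective:", obj, " non-zeros:", np.count_nonzero(beta))
```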
Sub-linear Time Compressed Sensing for Support Recovery using Sparse-Graph Codes
We address the problem of robustly recovering the support of high-dimensional sparse signals from linear measurements in a low-dimensional subspace. We introduce a new compressed sensing framework through carefully designed sparse measurement matrices associated with low measurement costs and low-complexity recovery algorithms. The measurement system in our framework captures observations of t...
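As a rough illustration only (not the paper's sparse-graph-code construction), the sketch below builds a sparse measurement matrix from a random bipartite graph in which each signal coordinate participates in only a few measurements, which is the structural idea behind low-cost sparse measurement designs. The left degree, matrix sizes, and 0/1 edge weights are assumptions made for illustration.

```python
# Sparse bipartite-graph measurement matrix: each signal coordinate touches only
# d_left measurements, so every observation aggregates a handful of entries.
import numpy as np

rng = np.random.default_rng(3)
p, m, d_left = 1000, 120, 3      # signal length, measurements, edges per coordinate

A = np.zeros((m, p))
for j in range(p):
    rows = rng.choice(m, size=d_left, replace=False)
    A[rows, j] = 1.0             # coordinate j participates in d_left measurements

# A k-sparse signal and its low-cost observations y = A x.
x = np.zeros(p)
x[rng.choice(p, size=10, replace=False)] = rng.normal(size=10)
y = A @ x
print("non-zeros per row (avg):", A.sum(axis=1).mean())
```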
When in Doubt, SWAP: High-Dimensional Sparse Recovery from Correlated Measurements
We consider the problem of accurately estimating a high-dimensional sparse vector using a small number of linear measurements that are contaminated by noise. It is well known that standard computationally tractable sparse recovery algorithms, such as the Lasso, OMP, and their various extensions, perform poorly when the measurement matrix contains highly correlated columns. We develop a simple g...
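The following is a hedged sketch of a swap-style refinement suggested by the title above (not necessarily the authors' exact algorithm): starting from an initial support estimate, exchange one selected column for an unselected one whenever the swap reduces the least-squares residual, which can help when columns are highly correlated. All sizes and the crude random initializer are illustrative assumptions.

```python
# Swap-style support refinement sketch: greedily exchange one in-support column
# for one out-of-support column whenever the least-squares residual decreases.
import numpy as np

rng = np.random.default_rng(4)
n, p, k = 100, 60, 5
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n)     # two highly correlated columns
beta = np.zeros(p)
beta[[0, 10, 20, 30, 40]] = 1.0
y = X @ beta + 0.1 * rng.normal(size=n)

def residual(S):
    """Least-squares residual when regressing y on the columns in S."""
    cols = list(S)
    coef, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
    return np.sum((y - X[:, cols] @ coef) ** 2)

S = set(rng.choice(p, size=k, replace=False))      # crude initial support estimate
improved = True
while improved:
    improved = False
    for i in list(S):
        for j in set(range(p)) - S:
            T = (S - {i}) | {j}
            if residual(T) < residual(S) - 1e-10:  # accept only strict improvements
                S, improved = T, True
                break
        if improved:
            break

print("estimated support:", sorted(S))
```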
Journal title:
- CoRR
Volume abs/0805.3005, issue
Pages -
Publication date 2008